
    Dynamic ridge polynomial neural network with Lyapunov function for time series forecasting

    The ability to model the behaviour of an arbitrary dynamic system is one of the most useful properties of recurrent networks. The dynamic ridge polynomial neural network (DRPNN) is a recurrent neural network used for time series forecasting. Despite the potential and capability of the DRPNN, stability problems can occur due to the presence of recurrent feedback. Therefore, in this study, a sufficient condition based on an adaptive learning rate is developed by introducing a Lyapunov function. To compare the performance of the proposed solution with the existing solution, which is derived from the stability theorem for a feedback network, we used six time series: Darwin sea level pressure, monthly smoothed sunspot numbers, Lorenz, Santa Fe laser, daily Euro/Dollar exchange rate, and the Mackey-Glass time-delay differential equation. Simulation results confirmed the stability of the proposed solution and showed an average 21.45% improvement in Root Mean Square Error (RMSE) with respect to the existing solution. Furthermore, the proposed solution is faster than the existing solution, because it removes the network size restriction found in the existing solution and uses the calculated dynamic system variable to check stability, unlike the existing solution, which needs more calculation steps.
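
    As an illustration of the adaptive-learning-rate idea, the sketch below applies a generic Lyapunov-style bound to a single gradient step: with V(k) = ½e(k)² and a linearised error update, ΔV < 0 is guaranteed when the step size stays below 2/‖∂ŷ/∂w‖². This is a minimal sketch of the general technique only; the function name, safety factor, and update form are illustrative assumptions, not the DRPNN update rule derived in the paper.

```python
import numpy as np

# Hypothetical sketch: one gradient step whose learning rate is capped by a
# Lyapunov-style bound.  With V(k) = 0.5 * e(k)^2 and a linearised error
# update, Delta V < 0 holds when 0 < eta < 2 / ||dy/dw||^2, so eta(k) is
# chosen as a safe fraction of that bound.  Not the paper's DRPNN rule.

def adaptive_lr_step(weights, dy_dw, error, safety=0.5):
    """Update `weights` once using a stability-bounded learning rate.

    weights : current weight vector, shape (n,)
    dy_dw   : gradient of the network output w.r.t. the weights, shape (n,)
    error   : scalar e(k) = target - prediction
    safety  : fraction of the Lyapunov bound to use (0 < safety < 1)
    """
    g2 = float(np.dot(dy_dw, dy_dw))
    if g2 == 0.0:                      # flat gradient: skip the update
        return weights, 0.0
    eta = safety * 2.0 / g2            # eta(k) below 2/||dy/dw||^2 keeps dV < 0
    # Gradient of 0.5*e^2 w.r.t. w is -e * dy/dw, so descend along +e * dy/dw.
    return weights + eta * error * dy_dw, eta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w, eta = adaptive_lr_step(rng.normal(size=4), rng.normal(size=4), 0.3)
    print("learning rate used:", eta)
```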

    Improved Performance of Secured VoIP Via Enhanced Blowfish Encryption Algorithm

    The development and integration of efficient networks, open source technology, and Voice over Internet Protocol (VoIP) applications have become increasingly important and have gained rapid popularity due to emerging IP-based network technology. Nonetheless, security and privacy concerns have emerged as issues that need to be addressed. Encryption and decryption protect data from being altered or intercepted, and a private VoIP call supports confidential conversations for purposes such as telebanking, telepsychiatry, health and safety issues, and many more. Hence, this study quantified VoIP performance and voice quality under security implementation using IPSec and an enhancement of the Blowfish encryption algorithm. The primary objective of this study is to improve the performance of the Blowfish encryption algorithm. The proposed algorithm was tested with varying network topologies and a variety of audio codecs, which affect the VoIP network. A network testbed with seven experiments and network configurations was set up in two labs to determine the effects on network performance. In addition, OPNET simulations of 54 network scenarios were compared with the network testbed for validation and verification purposes. An enhanced Blowfish algorithm for VoIP services was then designed and evaluated throughout this research. From the standpoint of VoIP session and service performance, the redesign of the Blowfish algorithm produced several significant effects that improved both the performance of the VoIP network and the quality of voice. This finding indicates opportunities to enhance the encryption algorithm, data privacy, and integrity, where a balance between Quality of Service (QoS) and security techniques can be applied to boost network throughput, performance, and voice quality of existing VoIP services. Overall, this study makes a threefold contribution: the Blowfish algorithm is redesigned to reduce computational resources; VoIP network performance is analysed and compared in terms of end-to-end delay, jitter, and packet loss; and the improvement in voice quality in VoIP services, including the effect of the enhanced Blowfish algorithm on voice quality, is quantified using a variety of voice codecs.
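
    For orientation, the sketch below encrypts an RTP-sized voice payload with standard Blowfish in CBC mode using the PyCryptodome library. The abstract does not specify the enhanced Blowfish design, so this only shows the baseline algorithm the study starts from; the 160-byte payload (20 ms of G.711 audio) and 16-byte key are assumptions made for illustration.

```python
# Baseline Blowfish-CBC over a VoIP-style payload with PyCryptodome
# (pip install pycryptodome).  Illustrative only; the paper's enhanced
# Blowfish variant is not described in the abstract.
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(16)                     # Blowfish accepts 4-56 byte keys
payload = get_random_bytes(160)                # e.g. 20 ms of G.711 audio

cipher = Blowfish.new(key, Blowfish.MODE_CBC)  # a random 8-byte IV is generated
ciphertext = cipher.iv + cipher.encrypt(pad(payload, Blowfish.block_size))

# Receiver side: split off the IV, decrypt, strip the padding.
iv, body = ciphertext[:8], ciphertext[8:]
plain = unpad(Blowfish.new(key, Blowfish.MODE_CBC, iv).decrypt(body),
              Blowfish.block_size)
assert plain == payload
```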

    Financial time series prediction using spiking neural networks

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional" rate-encoded neural networks, a Multi-Layer Perceptron and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrates the applicability of the Polychronous Spiking Network to financial data forecasting, which in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments. © 2014 Reid et al.
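
    The evaluation metrics named above can be computed in several ways; the sketch below gives one common set of conventions (a 252-trading-day year and an SNR defined from signal and residual power) purely for illustration, and is not taken from Reid et al.

```python
import numpy as np

# Illustrative metric implementations: prediction RMSE, Annualised Return,
# Maximum Drawdown and Signal-to-Noise Ratio.  The conventions are assumptions.

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def annualised_return(period_returns, periods_per_year=252):
    r = np.asarray(period_returns, float)
    growth = np.prod(1.0 + r)                     # total growth factor
    return float(growth ** (periods_per_year / len(r)) - 1.0)

def max_drawdown(equity_curve):
    eq = np.asarray(equity_curve, float)
    peak = np.maximum.accumulate(eq)              # running maximum of the curve
    return float(np.max((peak - eq) / peak))      # largest relative drop

def signal_to_noise_db(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    noise = np.mean((y_true - y_pred) ** 2)
    signal = np.mean(y_true ** 2)
    return float(10.0 * np.log10(signal / noise))
```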

    Improving quality of life through the routine use of the Patient Concerns Inventory for head and neck cancer patients: a cluster preference randomized controlled trial

    This trial is funded by the RfPB on behalf of the NIHR (PB-PG-0215-36047). Background: The consequences of treatment for head and neck cancer (HNC) patients have profound detrimental impacts such as impaired QOL, emotional distress, delayed recovery and frequent use of healthcare. The aim of this trial is to determine if the routine use of the Patient Concerns Inventory (PCI) package in review clinics during the first year following treatment can improve overall quality of life, reduce the social-emotional impact of cancer and reduce levels of distress. Furthermore, we aim to describe the economic costs and benefits of using the PCI. Methods: This will be a cluster preference randomised controlled trial with consultants either 'using' or 'not using' the PCI package at clinic. It will involve two centres, Leeds and Liverpool. 416 eligible patients from at least 10 consultant clusters are required to show a clinically meaningful difference in the primary outcome. The primary outcome is the percentage of participants with less than good overall quality of life at the final one-year clinic, as measured by the University of Washington QOL questionnaire version 4 (UWQOLv4). Secondary outcomes at one year are the mean social-emotional subscale (UWQOLv4) score, a Distress Thermometer (DT) score ≥ 4, and key health economic measures (QALY from the EQ-5D-5L; CSRI). Discussion: This trial will provide knowledge on the effectiveness of a consultation intervention package based around the PCI used at routine follow-up clinics following treatment of head and neck cancer with curative intent. If this intervention is (cost-)effective for patients, the next step will be to promote wider use of this approach as standard care in clinical practice. Trial registration: 32,382. Clinical Trials Identifier, NCT03086629. Protocol: Version 3.0, 1st July 2017.

    Protocol for a systematic review of screening tools for fear of recurrent illness in common life threatening diseases

    This is the authors' accepted version of an article published in Systematic Reviews, 2015. A myocardial infarction (MI) ('heart attack') can be intensely stressful, and the impact of this event can leave patients with clinically significant post-MI stress symptoms. Untreated stress can make heart disease worse. Few tools are available that screen for specific thoughts or beliefs that can trigger post-MI stress responses. In other life-threatening illnesses, fear of recurrence (FoR) of illness has been identified as a key stressor, and screening tools have been developed to identify it. The aim of this review is to identify FoR screening tools used in other common life-threatening diseases that report on the development of the tool, to assess whether any can be adapted for use in MI survivors so that those with high levels of FoR can be identified and helped.

    Corporate reputation in the Spanish context: An interaction between reporting to stakeholders and industry.

    The authors describe the intensity and orientation of corporate social responsibility (CSR) reporting in four Spanish industries and explore the relationship between both concepts and an independent measurement of reputation for CSR (CSRR). The results demonstrate that CSR reporting is especially relevant and useful in the finance industry. Finance companies report significantly more CSR information than most industries in Spain, and this reporting is more closely linked to their CSRR than the CSR reporting of the basic, consumer goods and services industries.

    Measurements of differential cross-sections in top-quark pair events with a high transverse momentum top quark and limits on beyond the Standard Model contributions to top-quark pair production with the ATLAS detector at √s = 13 TeV

    Cross-section measurements of top-quark pair production where the hadronically decaying top quark has transverse momentum greater than 355 GeV and the other top quark decays into ℓνb are presented using 139 fb−1 of data collected by the ATLAS experiment during proton-proton collisions at the LHC. The fiducial cross-section at √s = 13 TeV is measured to be σ = 1.267 ± 0.005 ± 0.053 pb, where the uncertainties reflect the limited number of data events and the systematic uncertainties, giving a total uncertainty of 4.2%. The cross-section is measured differentially as a function of variables characterising the tt̄ system and additional radiation in the events. The results are compared with various Monte Carlo generators, including comparisons where the generators are reweighted to match a parton-level calculation at next-to-next-to-leading order. The reweighting improves the agreement between data and theory. The measured distribution of the top-quark transverse momentum is used to search for new physics in the context of the effective field theory framework. No significant deviation from the Standard Model is observed and limits are set on the Wilson coefficients of the dimension-six operators OtG and Otq(8), where the limits on the latter are the most stringent to date.
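
    For readers checking the numbers, the quoted 4.2% total uncertainty follows from the two quoted uncertainties, assuming they are combined in quadrature:

```latex
\[
\frac{\sqrt{0.005^2 + 0.053^2}}{1.267}
  \;\approx\; \frac{0.0532}{1.267}
  \;\approx\; 0.042 \;=\; 4.2\%.
\]
```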

    Measurement of the tt̄tt̄ production cross section in pp collisions at √s = 13 TeV with the ATLAS detector

    A measurement of four-top-quark production using proton-proton collision data at a centre-of-mass energy of 13 TeV collected by the ATLAS detector at the Large Hadron Collider corresponding to an integrated luminosity of 139 fb−1 is presented. Events are selected if they contain a single lepton (electron or muon) or an opposite-sign lepton pair, in association with multiple jets. The events are categorised according to the number of jets and how likely these are to contain b-hadrons. A multivariate technique is then used to discriminate between signal and background events. The measured four-top-quark production cross section is found to be 26 +17/−15 fb, with a corresponding observed (expected) significance of 1.9 (1.0) standard deviations over the background-only hypothesis. The result is combined with the previous measurement performed by the ATLAS Collaboration in the multilepton final state. The combined four-top-quark production cross section is measured to be 24 +7/−6 fb, with a corresponding observed (expected) signal significance of 4.7 (2.6) standard deviations over the background-only predictions. It is consistent within 2.0 standard deviations with the Standard Model expectation of 12.0 ± 2.4 fb.

    Measurement of the energy asymmetry in tt̄j production at 13 TeV with the ATLAS experiment and interpretation in the SMEFT framework

    A measurement of the energy asymmetry in jet-associated top-quark pair production is presented using 139 fb−1 of data collected by the ATLAS detector at the Large Hadron Collider during pp collisions at √s = 13 TeV. The observable measures the different probability of top and antitop quarks to have the higher energy as a function of the jet scattering angle with respect to the beam axis. The energy asymmetry is measured in the semileptonic tt̄ decay channel, and the hadronically decaying top quark must have transverse momentum above 350 GeV. The results are corrected for detector effects to particle level in three bins of the scattering angle of the associated jet. The measurement agrees with the SM prediction at next-to-leading-order accuracy in quantum chromodynamics in all three bins. In the bin with the largest expected asymmetry, where the jet is emitted perpendicular to the beam, the energy asymmetry is measured to be −0.043 ± 0.020, in agreement with the SM prediction of −0.037 ± 0.003. Interpreting this result in the framework of the Standard Model effective field theory (SMEFT), it is shown that the energy asymmetry is sensitive to the top-quark chirality in four-quark operators and is therefore a valuable new observable in global SMEFT fits.
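
    For orientation, the energy asymmetry used in this kind of analysis is commonly defined (the definition below is quoted from the wider literature, not from the abstract itself) as

```latex
\[
A_E(\theta_j) \;=\;
  \frac{\sigma(\Delta E > 0 \mid \theta_j) \;-\; \sigma(\Delta E < 0 \mid \theta_j)}
       {\sigma(\Delta E > 0 \mid \theta_j) \;+\; \sigma(\Delta E < 0 \mid \theta_j)},
\qquad \Delta E = E_t - E_{\bar{t}},
\]
```

    where θ_j is the scattering angle of the associated jet with respect to the beam axis.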

    Constraints on Higgs boson production with large transverse momentum using H → bb̄ decays in the ATLAS detector

    This paper reports constraints on Higgs boson production with transverse momentum above 1 TeV. The analyzed data from proton–proton collisions at a center-of-mass energy of 13 TeV were recorded with the ATLAS detector at the Large Hadron Collider from 2015 to 2018 and correspond to an integrated luminosity of 136 fb−1. Higgs bosons decaying into bb̄ are reconstructed as single large-radius jets recoiling against a hadronic system and are identified by the experimental signature of two b-hadron decays. The experimental techniques are validated in the same kinematic regime using the Z → bb̄ process. The 95% confidence-level upper limit on the cross section for Higgs boson production with transverse momentum above 450 GeV is 115 fb, and above 1 TeV it is 9.6 fb. The Standard Model cross section predictions for a Higgs boson with a mass of 125 GeV in the same kinematic regions are 18.4 fb and 0.13 fb, respectively.
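
    Dividing the quoted limits by the corresponding Standard Model predictions (simple arithmetic on the numbers above, not a value stated in the abstract) puts the limits at roughly

```latex
\[
\frac{115~\text{fb}}{18.4~\text{fb}} \approx 6
\qquad\text{and}\qquad
\frac{9.6~\text{fb}}{0.13~\text{fb}} \approx 74
\]
```

    times the SM expectation for transverse momentum above 450 GeV and above 1 TeV, respectively.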